640 research outputs found

    Safe code transformations for speculative execution in real-time systems

    Although compiler optimization techniques are standard and successful in non-real-time systems, if naively applied they can destroy safety guarantees and deadlines in hard real-time systems. For this reason, real-time systems developers have tended to avoid automatic compiler optimization of their code. However, real-time applications in several areas have been growing substantially in size and complexity in recent years. This size and complexity make it impossible for real-time programmers to write optimal code, and consequently indicate a need for compiler optimization. Recently, researchers have developed or modified analyses and transformations to improve performance without degrading worst-case execution times. Moreover, these optimization techniques can sometimes transform programs which may not meet constraints/deadlines, or which result in timeouts, into deadline-satisfying programs. One such technique, speculative execution, also used for example in parallel computing and databases, can enhance performance by executing parts of the code whose results may or may not be needed. In some cases, rollback is necessary if the computation turns out to be invalid. However, speculative execution must be applied carefully to real-time systems so that the worst-case execution path is not extended. Deterministic worst-case execution for satisfying hard real-time constraints, and speculative execution with rollback for improving average-case throughput, appear to lie on opposite ends of a spectrum of performance requirements and strategies.
Nonetheless, this thesis shows that there are situations in which speculative execution can improve the performance of a hard real-time system, either by enhancing average performance while not affecting the worst case, or by actually decreasing the worst-case execution time. The thesis proposes a set of compiler transformation rules to identify opportunities for speculative execution and to transform the code. Proofs of semantic correctness and timeliness preservation are provided to verify the safety of applying the transformation rules to real-time systems. Moreover, an extensive experiment using simulation of randomly generated real-time programs has been conducted to evaluate the applicability and profitability of speculative execution. The simulation results indicate that speculative execution improves average execution time and program timeliness. Finally, a prototype implementation is described in which these transformations can be evaluated for realistic applications.
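The speculate-validate-rollback pattern described in this abstract can be sketched in a few lines. This is a minimal illustration of the general technique, not the thesis's transformation rules; `guess_branch`, `validate`, and `fallback` are hypothetical placeholders:

```python
import copy

def speculative_execute(state, guess_branch, validate, fallback):
    """Run guess_branch speculatively on state; if validate rejects the
    result, roll state back to the checkpoint and run the safe fallback."""
    snapshot = copy.deepcopy(state)   # checkpoint taken before speculation
    result = guess_branch(state)      # execute the possibly-unneeded work
    if validate(result):
        return result                 # speculation paid off
    state.clear()
    state.update(snapshot)            # rollback: restore the checkpoint
    return fallback(state)            # non-speculative path
```

In a hard real-time setting the point of the thesis is that both the speculative path and the rollback-plus-fallback path must fit inside the worst-case execution budget; this sketch only shows the functional behavior.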

    Overlapping Multi-hop Clustering for Wireless Sensor Networks

    Clustering is a standard approach for achieving efficient and scalable performance in wireless sensor networks. Traditionally, clustering algorithms aim at generating a number of disjoint clusters that satisfy some criteria. In this paper, we formulate a novel clustering problem that aims at generating overlapping multi-hop clusters. Overlapping clusters are useful in many sensor network applications, including inter-cluster routing, node localization, and time synchronization protocols. We also propose a randomized, distributed multi-hop clustering algorithm (KOCA) for solving the overlapping clustering problem. KOCA aims at generating connected overlapping clusters that cover the entire sensor network with a specific average overlapping degree. Through analysis and simulation experiments, we show how to select the algorithm's parameter values to achieve the clustering objectives. Moreover, the results show that KOCA produces approximately equal-sized clusters, which allows the load to be distributed evenly over the different clusters. In addition, KOCA is scalable: cluster formation terminates in constant time regardless of the network size.
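The core idea the abstract describes — heads elected with a fixed probability, each cluster spanning all nodes within k hops of its head, with nodes allowed to join several clusters — can be sketched as follows. This is an illustrative reconstruction from the abstract, not the authors' KOCA code; the parameter names and the centralized BFS formulation are assumptions (the real algorithm is distributed):

```python
import random
from collections import deque

def overlapping_khop_clusters(adj, p=0.3, k=2, seed=0):
    """Elect each node as a cluster head with probability p, then form one
    cluster per head containing every node within k hops of that head.
    Clusters may overlap, since a node can lie within k hops of several heads.

    adj: dict mapping node -> list of neighbor nodes.
    Returns: dict mapping head -> set of member nodes.
    """
    rng = random.Random(seed)
    heads = [v for v in adj if rng.random() < p]
    clusters = {}
    for h in heads:
        dist = {h: 0}                  # BFS out to k hops from the head
        q = deque([h])
        while q:
            u = q.popleft()
            if dist[u] == k:
                continue
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        clusters[h] = set(dist)
    return clusters
```

On a 5-node path graph with p=1.0 and k=1, every node becomes a head and interior nodes belong to three clusters at once, which is the overlap the paper exploits for inter-cluster routing.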

    DAUGHTERS OF THE NILE: THE EVOLUTION OF FEMINISM IN EGYPT


    Egyptian audience's perception of a political satire show: an analysis of the perceived and actual influence of political comedy programming

    As more Egyptians continue to expose themselves to the political comedy program Al Bernameg, scholars should seek to understand the audience's perception of the program. This study examines Egyptian audience perceptions of the political satire show Al Bernameg. The study aims at measuring the perceived bias of Al Bernameg and the perceived and actual influence of the program. It seeks to explore the influence Al Bernameg has on its audience by relating it to third-person effect theory, and it examines the difference between the actual and perceived purposes for watching Al Bernameg. The study employed survey research as its primary method of data collection, investigating the research hypotheses via a purposive sample of 508 Egyptians. The research findings revealed a significant third-person effect pattern for the political satire program, especially among younger viewers.

    A generalized approach for the control of micro-electromechanical relays

    MEMS (Micro-Electromechanical Systems) is an increasingly popular area of research and applications. It is mainly concerned with integrating micro-mechanical transducers with micro-electronic circuits on common substrates, traditionally silicon, through micro-fabrication. Instead of having the transducer and the communicating (or control) circuit as two separate entities, MEMS miniaturizes and combines them on a single chip, which offers several advantages: saving space and money and increasing the sensitivity and accuracy of the integrated system. A micro-electromechanical relay is a type of MEMS device that is becoming increasingly important in a wide range of industries, such as the computer, medical, and automotive industries, to name a few. However, micro-relays, both electrostatic and electromagnetic, share a common dynamic structure that causes an unfavorable phenomenon called pull-in, in which the movable electrode snaps down onto the fixed electrode once it reaches a certain gap spacing, possibly damaging the relay and creating undesirable output effects. To eliminate this phenomenon and gain better control over the switching of micro-relays, improving transient response and output error, a feedback control scheme is desired. In this work, it is shown that voltage-controlled electromechanical micro-relays have a common dynamic structure allowing for the formulation of a generalized model. It is also shown that open-loop control of MEMS relays naturally leads to pull-in during closing. An attempt has been made to control the relays, eliminating this phenomenon and tracking a command signal that dictates the motion of the movable electrode over time with improved transient response. In doing so, two control schemes were adopted: a Lyapunov-based one and a feedback linearization-based one. Simulation results clearly show the superiority of closed-loop control compared to open-loop control.
It is also shown that the Lyapunov-based controller was limited in the extent to which it improved the transient response, and that the feedback linearization-based controller performed much better. The latter eliminated pull-in and significantly shortened transient response and settling times, leading to very good tracking of the command signal.
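The pull-in phenomenon described above can be illustrated with the textbook one-degree-of-freedom parallel-plate model (spring force kx balancing the electrostatic attraction εAV²/2(g₀−x)²). This is a minimal sketch of that standard model, not the thesis's generalized micro-relay model; the parameter values below are arbitrary illustrative numbers:

```python
import math

def pull_in_voltage(k, g0, eps_A):
    """Classic parallel-plate pull-in voltage V_pi = sqrt(8*k*g0^3 / (27*eps_A)).
    Above this bias, the electrostatic force grows faster with displacement
    than the linear spring can resist, and the electrode snaps down."""
    return math.sqrt(8 * k * g0**3 / (27 * eps_A))

def net_force(x, V, k, g0, eps_A):
    """Spring restoring force minus electrostatic attraction at gap g0 - x.
    A stable open-loop equilibrium requires this to cross zero with positive
    slope, which only happens for x below one third of the initial gap g0."""
    return k * x - eps_A * V**2 / (2 * (g0 - x)**2)
```

At exactly V = V_pi the equilibrium sits at x = g₀/3, the furthest the electrode can travel under stable open-loop bias; closed-loop control (as in the thesis) is what allows commanding positions beyond that point without snap-down.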

    THE IMPACT OF BIG DATA ANALYTICS ON IMPROVING FINANCIAL REPORTING QUALITY

    Purpose – The current study aims to clarify the importance of big data analytics and its role in changing the accounting profession and the roles of accountants, in addition to testing the impact of big data analytics on improving financial reporting quality in the Saudi environment. Design/methodology/approach – To achieve the study's goals and validate its hypotheses, relevant previous literature and research are referred to. A field study was also conducted by distributing a questionnaire to (154) academics, financial analysts, accountants, and experts in the field of big data analysis in the Kingdom of Saudi Arabia in 2019. Data were analyzed using the Statistical Package for the Social Sciences (SPSS 17.0). Findings – The study concluded that although business organizations face several challenges when analyzing data, big data analytics has a significant role in achieving high competitiveness for institutions, improving accounting information quality, providing appropriate information that helps in rationalizing decisions within the economic unit, and providing future information affecting stakeholders' decisions. The study also proved that there is a statistically significant effect of big data analytics on improving the quality of accounting information: big data analytics clearly affects the characteristics of accounting information quality, positively affecting the quality of financial reports. Originality/Value – Big data analytics is one of the most important topics, as it positively affects the improvement of accounting information quality, which is reflected in financial reporting quality. Hence, academics and institutions should pay attention to this topic and follow new ideas in it.
The present study is one of the first to deal with this topic and to examine the relationship between big data analytics and the characteristics of accounting information that positively affect financial reporting quality.

    Analysis of Neurovirulence in the Mouse Model System Using Deletion Variants of Herpes Simplex Virus Type 2 (HSV-2)

    The aim of the work described in this thesis was to identify the gene(s) involved in determining the neurovirulence of HSV-2 strain HG52 in the mouse model system using deletion variants. The availability of variants with deletions in specific regions of the genome afforded a unique opportunity to determine the possible role of specific sequences in virulence. The phenotype of the parental wild-type virus was also determined by examining the neurovirulence of individual plaque stocks, to establish the baseline from which to evaluate the deletion variants. Twenty well-separated plaques were picked from the elite stock of HG52 and passaged twice at 37°C. Restriction endonuclease analysis of the DNA from each of the twenty plaque stocks showed no differences in the size of fragments or the distribution of sites. To determine their neurovirulence, ten of the individual plaque stocks were selected randomly for mouse inoculation. Following intracranial inoculation of 3-week-old BALB/c mice, the plaque stocks segregated into three classes of neurovirulence on the basis of their LD50 values: high (<10^3 pfu/mouse), intermediate (10^3-10^4 pfu/mouse) and low virulence (>10^5 pfu/mouse). The particle:pfu ratios of the plaque stocks were within the acceptable range for HSV-2. Restriction endonuclease analysis of viruses reisolated from the brains of infected mice showed no apparent difference in their DNA profiles compared to the initial infecting viruses. Two plaque stocks of high, one of intermediate and one of low virulence were selected on the basis of their LD50 values and particle:pfu ratios. These stocks retained their original values compared to the non-plaque-purified elite stock of HG52 (10^2 pfu/mouse) when retested in mice. Following intraperitoneal inoculation, the selected plaque stocks showed differences in their LD50 values comparable to the differences seen following intracranial inoculation.
The selected plaque stocks grew as well as the parental HG52 in one-step growth experiments in BHK-21 C13 cells. However, they showed differences in growth kinetics in vivo in mouse brain, where the high-virulence stocks showed growth comparable to HG52, while the intermediate- and low-virulence stocks grew less well. The selected plaque stocks were passaged five times in BHK-21 C13 cells, and ten plaques from each were picked and stocks grown. Restriction endonuclease analysis of their DNA showed no differences compared to the wild-type HG52. Following intracranial inoculation of the virus stocks derived from the fifth passage, those derived from intermediate- or low-virulence virus remained stably of intermediate or low virulence, while those derived from high-virulence virus showed in some cases a shift to intermediate levels of virulence. These results clearly demonstrate virulence heterogeneity within the elite stock of HSV-2 strain HG52. The LD50 value of high-virulence virus was chosen as the baseline from which to evaluate the virulence of the deletion variants. A number of deletion variants of HSV-2 strain HG52, with deletions ranging in size from 1.5 to 9 kb, were tested for virulence following intracranial inoculation of 3-week-old BALB/c mice. The variants segregated into three categories of virulence on the basis of their LD50 values: avirulent (>10^7 pfu/mouse), reduced (10 -10 pfu/mouse) and attenuated (10 pfu/mouse), compared to 10 pfu/mouse for the elite stock of HG52. Analysis of the variants with deletions in US/TRS indicated that the US4, US10, US11 and US12 genes have a role in neurovirulence. Deletion of one copy of oriS and one copy of the IE3 gene has a minor effect on neurovirulence. Analysis of the variants with deletions in the IRL/TRL regions of the genome implies that deletion of one copy of the IE1 gene and part of the LAT transcripts in IRL has a minor effect on neurovirulence.
The analysis demonstrated that none of the deleted genes appears to be a unique determinant of neurovirulence, with the exception of the DNA sequences between 0-0.02 and 0.81-0.83 m.u. The variant JH2604, whose genome is deleted by 1.5 kb in both copies of the BamHI v fragment between 0-0.02 and 0.81-0.83 m.u. in TRL and IRL respectively, was avirulent for mice following intracranial and footpad inoculation, with LD50 values of >10^7 and >10^8 pfu/mouse respectively, compared to 10 pfu/mouse for the elite stock of HG52. Therefore, the variant JH2604 is at least 6 logs less neurovirulent than the wild-type virus. The variant JH2604 grew as well as the wild-type virus in vitro in BHK-21 C13 and 3T6 cells, but it failed to grow in mouse brain in vivo, demonstrating that the lack of neurovirulence was due to inefficient replication in mouse brain.

    FOUNDING SCIENCE INSTITUTES: THE TRIPLE-HELIX AND HOW THE GLOBAL INSTITUTE FOR FOOD SECURITY WAS FOUNDED

    In addition to teaching and research, many universities around the world have started to assume a direct role in economic development. In the literature, this trend is referred to as creating the entrepreneur university. Focusing on the interaction among government, business, and academia, the triple-helix theory is frequently used by contemporary social scientists to analyze the processes of creating the entrepreneur university. When reviewing the literature on the triple-helix theory, I realized that a study was needed of a contemporary and global institute, intended from the beginning to function through the interaction of government, business, and academia, and incorporating informants' and participants' perceptions. I posed my research question as follows: Does the triple-helix theory explain the factors, motivations, and social processes that led to the creation of the University of Saskatchewan's Global Institute for Food Security (GIFS)? To answer this question, I conducted interviews with key academic, business, and government actors, gathered archival documents and media reports, and used qualitative data analysis and triangulation. My research findings indicate that the role of industry in creating the GIFS was strong and that the GIFS embodies the new policy of the University, which, as recommended by supporters of commercialization, can be summarized in the following four points: improving signature areas, improving the position of the University within university rankings, increasing central planning, and attracting private funding and partnerships. Furthermore, the findings indicate that, for the most part, the triple-helix theory does not help in explaining how the GIFS was founded, as it does not problematize power relations and it appraises the status quo.